# Low-resource BERT

**NeuroBERT** (boltuix · MIT license)
NeuroBERT is a lightweight natural language processing model based on BERT, optimized for resource-constrained devices and suited to edge computing and IoT scenarios.
Tags: Large Language Model, Transformers · Downloads: 231 · Likes: 9
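
If NeuroBERT follows the usual Hugging Face Transformers conventions, it can be loaded through the standard pipeline API. The sketch below assumes a hub repository ID of `boltuix/NeuroBERT` and fill-mask support; both are assumptions to verify against the actual model card.

```python
# Minimal sketch: running a lightweight BERT-style model with the Transformers pipeline.
# The repo ID "boltuix/NeuroBERT" and fill-mask support are assumptions, not confirmed here.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="boltuix/NeuroBERT")

# BERT-style masked-language models predict the token hidden behind [MASK].
for candidate in fill_mask("The sensor uploads readings to the [MASK]."):
    print(f"{candidate['token_str']:>12}  score={candidate['score']:.3f}")
```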
**BERT-Tiny Finetuned SST-2** (M-FAC)
This model is based on the BERT-tiny architecture and fine-tuned on the SST-2 dataset with the M-FAC second-order optimizer for text classification.
Tags: Text Classification, Transformers · Downloads: 59 · Likes: 0
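
For the SST-2 classifiers in this list, inference reduces to the standard text-classification pipeline. The sketch below assumes the checkpoint is hosted under an ID such as `M-FAC/bert-tiny-finetuned-sst2` (an assumption based on the listing); if the model config lacks a label mapping, the output labels may appear as `LABEL_0`/`LABEL_1` rather than positive/negative.

```python
# Minimal sketch: SST-2 sentiment classification with a BERT-tiny checkpoint.
# The repo ID "M-FAC/bert-tiny-finetuned-sst2" is assumed from the listing above.
from transformers import pipeline

classifier = pipeline("text-classification", model="M-FAC/bert-tiny-finetuned-sst2")

for text in ["a gripping, beautifully shot film", "tedious and far too long"]:
    result = classifier(text)[0]  # one dict per input: {"label": ..., "score": ...}
    print(f"{result['label']:>8}  {result['score']:.3f}  |  {text}")
```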
**BERT-Mini Finetuned SQuAD v2** (M-FAC)
This model is based on the BERT-mini architecture and fine-tuned on the SQuAD 2.0 dataset with the M-FAC second-order optimizer for question answering.
Tags: Question Answering, Transformers · Downloads: 17 · Likes: 0
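
Question answering with a SQuAD 2.0-style model uses the question-answering pipeline, which extracts an answer span from a supplied context. The repository ID below (`M-FAC/bert-mini-finetuned-squadv2`) is assumed from the listing; note that SQuAD 2.0 models are trained to handle unanswerable questions, so low-scoring or empty answers can occur when the context contains no answer.

```python
# Minimal sketch: extractive question answering with a BERT-mini SQuAD 2.0 checkpoint.
# The repo ID "M-FAC/bert-mini-finetuned-squadv2" is assumed from the listing above.
from transformers import pipeline

qa = pipeline("question-answering", model="M-FAC/bert-mini-finetuned-squadv2")

result = qa(
    question="Which optimizer was used for fine-tuning?",
    context="The model was fine-tuned on SQuAD 2.0 using the M-FAC second-order optimizer.",
)
print(result["answer"], f"(score={result['score']:.3f})")
```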
**BERT-Mini Finetuned SST-2** (M-FAC)
This model is a BERT-mini model fine-tuned on the SST-2 dataset with the M-FAC second-order optimizer for text classification.
Tags: Text Classification, Transformers · Downloads: 13.90k · Likes: 0